Tensor Completion

Authors

  • Weijia Shao
  • Rainer Gemulla
  • Gerhard Weikum
Abstract

The purpose of this thesis is to explore methods for solving the tensor completion problem. Inspired by the matrix completion problem, the tensor completion problem is formulated as an unconstrained nonlinear optimization problem that finds three factors giving a low-rank approximation. Various iterative methods, including gradient-based methods, the stochastic gradient descent method, and the alternating least squares method, are applied to the tensor completion problem and analyzed for the case of the squared loss with L2 regularization as the loss function. Targeting large-scale problems, the author proposes a parallelization strategy for each of these methods. Numerical experiments on synthetic data and real-world benchmarks demonstrate the speedup of the parallelization and compare the relative performance and scalability of these methods. The results suggest that both the stochastic gradient descent method and the alternating least squares method have advantages in solving large-scale tensor completion problems.
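To make the formulation concrete, the sketch below illustrates one of the approaches the abstract mentions: fitting three factor matrices to the observed entries of a third-order tensor by stochastic gradient descent on the squared loss with L2 regularization. It is a minimal illustration, not the thesis' implementation; the function name, rank, step size, regularization weight, and the per-entry placement of the regularization term are assumptions made for this example.

```python
# Minimal sketch of SGD for third-order CP tensor completion with squared loss
# and L2 regularization (illustrative only; hyperparameters are assumptions).
import numpy as np

def sgd_cp_completion(entries, shape, rank=10, eta=0.01, lam=0.1, epochs=50, seed=0):
    """entries: list of ((i, j, k), value) pairs for the observed cells."""
    rng = np.random.default_rng(seed)
    I, J, K = shape
    # Three factor matrices defining the low-rank model
    #   x[i, j, k]  ~  sum_r A[i, r] * B[j, r] * C[k, r]
    A = 0.1 * rng.standard_normal((I, rank))
    B = 0.1 * rng.standard_normal((J, rank))
    C = 0.1 * rng.standard_normal((K, rank))

    for _ in range(epochs):
        for idx in rng.permutation(len(entries)):   # visit observed entries in random order
            (i, j, k), x = entries[idx]
            err = np.dot(A[i] * B[j], C[k]) - x     # gradient factor of 0.5 * (pred - x)^2
            a, b, c = A[i].copy(), B[j].copy(), C[k].copy()
            # One SGD step on the regularized squared loss for this single entry
            A[i] -= eta * (err * b * c + lam * a)
            B[j] -= eta * (err * a * c + lam * b)
            C[k] -= eta * (err * a * b + lam * c)
    return A, B, C

# Hypothetical usage: recover a synthetic rank-2 tensor from roughly 20% of its entries.
I, J, K, R = 20, 30, 25, 2
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((n, R)) for n in (I, J, K))
observed = [((i, j, k), float(np.dot(A0[i] * B0[j], C0[k])))
            for i in range(I) for j in range(J) for k in range(K)
            if rng.random() < 0.2]
A, B, C = sgd_cp_completion(observed, (I, J, K), rank=R, lam=0.001)
```

The alternating least squares method that the abstract compares against would instead fix two of the factor matrices and solve a regularized least-squares problem for the third, cycling over the three factors; the abstract reports advantages for both this method and stochastic gradient descent on large-scale problems.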

Similar resources

Neuron Mathematical Model Representation of Neural Tensor Network for RDF Knowledge Base Completion

In this paper, a state-of-the-art neuron mathematical model of the neural tensor network (NTN) is proposed for the RDF knowledge base completion problem. One of the difficulties with the parameters of the network is that a representation of its neuron mathematical model is not possible. For this reason, a new representation of this network is suggested that solves this difficulty. In the representation, th...


DERIVATIONS OF TENSOR PRODUCT OF SIMPLE C*-ALGEBRAS

In this paper we study the properties of derivations of A ⊗ B, where A and B are simple separable C*-algebras, A ⊗ B is the C*-completion of the algebraic tensor product with respect to a C*-norm on it, and we characterize the derivations of A ⊗ B in terms of the derivations of A and B


Tensor Completion Algorithms in Big Data Analytics

Tensor completion is a problem of filling the missing or unobserved entries of partially observed tensors. Due to the multidimensional character of tensors in describing complex datasets, tensor completion algorithms and their applications have received wide attention and achievement in data mining, computer vision, signal processing, and neuroscience, etc. In this survey, we provide a modern ove...


Beyond Low Rank: A Data-Adaptive Tensor Completion Method

Low rank tensor representation underpins much of recent progress in tensor completion. In real applications, however, this approach is confronted with two challenging problems, namely (1) tensor rank determination; (2) handling real tensor data which only approximately fulfils the low-rank requirement. To address these two issues, we develop a data-adaptive tensor completion model which explici...


Efficient tensor completion: Low-rank tensor train

This paper proposes a novel formulation of the tensor completion problem to impute missing entries of data represented by tensors. The formulation is introduced in terms of tensor train (TT) rank, which can effectively capture global information of tensors thanks to its construction by a well-balanced matricization scheme. Two algorithms are proposed to solve the corresponding tensor completion p...
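As a point of reference for the tensor train (TT) formulation mentioned above, the following sketch shows only what the TT format itself looks like for a third-order tensor; it is not the paper's completion algorithm, and the dimensions and ranks are arbitrary illustrative values.

```python
# Illustration of the tensor train (TT) format for a 3-way tensor: each entry is
# a product of one matrix slice per TT core (dimensions and ranks are arbitrary here).
import numpy as np

I, J, K = 4, 5, 6        # tensor dimensions
r1, r2 = 3, 3            # internal TT ranks (boundary ranks are 1)

rng = np.random.default_rng(0)
G1 = rng.standard_normal((I, 1, r1))    # core 1: each slice G1[i] is 1 x r1
G2 = rng.standard_normal((J, r1, r2))   # core 2: each slice G2[j] is r1 x r2
G3 = rng.standard_normal((K, r2, 1))    # core 3: each slice G3[k] is r2 x 1

def tt_entry(i, j, k):
    """x[i, j, k] = G1[i] @ G2[j] @ G3[k], a 1 x 1 matrix."""
    return (G1[i] @ G2[j] @ G3[k]).item()
```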


5D reconstruction via robust tensor completion

Tensor completion techniques (including tensor denoising) can be used to solve the ubiquitous multidimensional data reconstruction problem. We present a robust tensor reconstruction method that can tolerate the presence of erratic noise. The method is derived by minimizing a robust cost function with the addition of low rank constraints. Our presentation is based on the Parallel Matrix Factoriz...





Publication date: 2012